A New Bayesian Lasso.
Authors
Abstract
Park and Casella (2008) provided the Bayesian lasso for linear models by assigning scale mixture of normal (SMN) priors on the parameters and independent exponential priors on their variances. In this paper, we propose an alternative Bayesian analysis of the lasso problem. A different hierarchical formulation of the Bayesian lasso is introduced by utilizing the scale mixture of uniform (SMU) representation of the Laplace density. We consider a fully Bayesian treatment that leads to a new Gibbs sampler with tractable full conditional posterior distributions. Empirical results and real data analyses show that the new algorithm has good mixing properties and performs comparably to the existing Bayesian method in terms of both prediction accuracy and variable selection. An ECM algorithm is provided to compute the MAP estimates of the parameters. Easy extension to general models is also briefly discussed.
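The abstract does not spell out the sampler, but the SMU identity it relies on is that a Laplace(0, 1/lambda) density is the marginal of beta_j | u_j ~ Uniform(-u_j, u_j) with u_j ~ Gamma(2, rate = lambda). Under that identity, u_j given beta_j is a shifted exponential and beta_j given u_j is a normal truncated to [-u_j, u_j], which is what makes the full conditionals tractable. The Python sketch below illustrates one Gibbs sampler of this kind for the linear model, assuming a fixed penalty lambda, a Jeffreys prior on sigma^2, and component-wise updates of beta; the function name smu_bayesian_lasso_gibbs and these modelling choices are illustrative assumptions, not the authors' exact hierarchy or implementation.

import numpy as np
from scipy import stats

def smu_bayesian_lasso_gibbs(X, y, lam=1.0, n_iter=5000, burn=1000, seed=0):
    # Minimal sketch of a Gibbs sampler using the scale-mixture-of-uniform
    # representation of the Laplace prior:
    #   beta_j | u_j ~ Uniform(-u_j, u_j),  u_j ~ Gamma(2, rate=lam),
    # so that marginally beta_j ~ Laplace(0, 1/lam).
    # Fixed lam and a Jeffreys prior on sigma^2 are assumptions for illustration.
    rng = np.random.default_rng(seed)
    n, p = X.shape
    xtx = np.sum(X ** 2, axis=0)            # x_j' x_j for each column
    beta = np.zeros(p)
    sigma2 = 1.0
    draws = []

    for it in range(n_iter):
        # u_j | beta_j is proportional to exp(-lam * u_j) on u_j > |beta_j|,
        # i.e. a shifted exponential.
        u = np.abs(beta) + rng.exponential(1.0 / lam, size=p)

        # beta_j | rest is a univariate normal truncated to [-u_j, u_j].
        for j in range(p):
            resid = y - X @ beta + X[:, j] * beta[j]
            m = X[:, j] @ resid / xtx[j]
            s = np.sqrt(sigma2 / xtx[j])
            a, b = (-u[j] - m) / s, (u[j] - m) / s
            beta[j] = stats.truncnorm.rvs(a, b, loc=m, scale=s,
                                          random_state=rng)

        # sigma^2 | rest under a Jeffreys prior is inverse gamma (n/2, RSS/2).
        rss = np.sum((y - X @ beta) ** 2)
        sigma2 = 1.0 / rng.gamma(n / 2.0, 2.0 / rss)

        if it >= burn:
            draws.append(beta.copy())

    return np.array(draws)

Blocked or joint updates of beta (e.g. sampling the whole vector from a box-truncated multivariate normal) and a hyperprior on lambda are natural refinements; the component-wise version above is chosen only to keep the sketch short.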
Similar articles
Bayesian Quantile Regression with Adaptive Lasso Penalty for Dynamic Panel Data
Dynamic panel data models form an important part of studies in medicine, the social sciences, and economics. The presence of the lagged dependent variable as an explanatory variable is a distinctive feature of these models. The estimation problem in these models arises from the correlation between the lagged dependent variable and the current disturbance. Recently, quantile regression to analyze dynamic pa...
Priors on the Variance in Sparse Bayesian Learning; the demi-Bayesian Lasso
We explore the use of proper priors for variance parameters of certain sparse Bayesian regression models. This leads to a connection between sparse Bayesian learning (SBL) models (Tipping, 2001) and the recently proposed Bayesian Lasso (Park and Casella, 2008). We outline simple modifications of existing algorithms to solve this new variant which essentially uses type-II maximum likelihood to f...
A Bayesian Lasso via reversible-jump MCMC
Variable selection is a topic of great importance in high-dimensional statistical modeling and has a wide range of real-world applications. Many variable selection techniques have been proposed in the context of linear regression, and the Lasso model is probably one of the most popular penalized regression techniques. In this paper, we propose a new, fully hierarchical, Bayesian version of the ...
Sparse kernel learning with LASSO and Bayesian inference algorithm
Kernelized LASSO (Least Absolute Selection and Shrinkage Operator) has been investigated in two separate recent papers [Gao, J., Antolovich, M., & Kwan, P. H. (2008). L1 LASSO and its Bayesian inference. In W. Wobcke, & M. Zhang (Eds.), Lecture notes in computer science: Vol. 5360 (pp. 318-324); Wang, G., Yeung, D. Y., & Lochovsky, F. (2007). The kernel path in kernelized LASSO. In Internationa...
Bayesian Adaptive Lasso
We propose the Bayesian adaptive Lasso (BaLasso) for variable selection and coefficient estimation in linear regression. The BaLasso is adaptive to the signal level by adopting different shrinkage for different coefficients. Furthermore, we provide a model selection machinery for the BaLasso by assessing the posterior conditional mode estimates, motivated by the hierarchical Bayesian interpreta...
Journal: Statistics and its interface
Volume 7, Issue 4
Pages: -
Publication year: 2014